The Year 2000 problem (also known as the Y2K problem, the millennium bug, the Y2K bug, or simply Y2K) was a problem for both digital (computer-related) and non-digital documentation and data storage situations which resulted from the practice of abbreviating a four-digit year to two digits.
In computer programs, the practice of representing the year with two digits becomes problematic when the year "rolls over" from x99 to x00, producing logical errors. This caused some date-related processing to operate incorrectly for dates and times on and after January 1, 2000, and on other critical dates that were billed as "event horizons". Without corrective action, it was suggested that long-working systems would break down when the "...97, 98, 99, 00..." ascending numbering assumption suddenly became invalid. Companies and organizations worldwide checked, fixed, and upgraded their computer systems.
While no globally significant computer failures occurred when the clocks rolled over into 2000, preparation for the Y2K problem had a significant effect on the computer industry. There were plenty of Y2K problems, and the fact that none of the glitches caused major incidents is seen as vindication of the Y2K preparation.[1] However, some questioned whether the absence of computer failures was the result of the preparation undertaken or whether the significance of the problem had been overstated.[1][2]
Many banks responded to the Y2K problem by requiring full four-digit year entries on check forms, which helped prevent the error from occurring in accounting environments.
Y2K was the common abbreviation for the year 2000 software problem. The abbreviation combines the letter Y for "year" and k for the SI unit prefix kilo, meaning 1000; hence, 2K signifies 2000. It was also named the Millennium Bug because it was associated with the popular (rather than literal) rollover of the millennium, even though the problem could have occurred at the end of any century.
The Year 2000 problem was the subject of the early book, Computers in Crisis by Jerome and Marilyn Murray (Petrocelli, 1984; reissued by McGraw-Hill under the title The Year 2000 Computing Crisis in 1996). The first recorded mention of the Year 2000 Problem on a Usenet newsgroup occurred Saturday, January 19, 1985 by Usenet poster Spencer Bolles.[3]
The acronym Y2K has been attributed to David Eddy, a Massachusetts programmer,[4] in an e-mail sent on June 12, 1995. He later said, "People were calling it CDC (Century Date Change), FADL (Faulty Date Logic) and other names."
Many computer programs stored years with only two digits; for example, 1980 would be stored as 80. Some such programs could not distinguish between the year 2000 and the year 1900. Other programs would try to represent the year 2000 as 19100. This could cause a complete failure and cause date comparisons to produce incorrect results. Some embedded systems, making use of similar date logic, were expected to fail and cause utilities and other crucial infrastructure to fail.
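The two failure modes described above can be seen in a minimal sketch. The following Python fragment is illustrative only and is not drawn from any actual Y2K-era system (which would typically have been written in COBOL or assembler).

```python
# Illustrative only: the arithmetic failures caused by two-digit years.

def years_elapsed(start_yy: int, end_yy: int) -> int:
    # Two-digit subtraction: a record opened in 1980 ("80") and closed in
    # 2000 ("00") appears to span a negative number of years.
    return end_yy - start_yy

def display_year(years_since_1900: int) -> str:
    # Some runtimes report the year as a count of years since 1900; naive
    # code simply prepended the literal characters "19" when printing.
    return "19" + str(years_since_1900)

print(years_elapsed(80, 0))   # -80 instead of the expected 20
print(display_year(99))       # '1999'
print(display_year(100))      # '19100' -- the year 2000 rendered incorrectly
```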
Some warnings of what would happen if nothing were done were particularly dire:
The Y2K problem is the electronic equivalent of the El Niño and there will be nasty surprises around the globe. — John Hamre, United States Deputy Secretary of Defense[5]
Special committees were set up by governments to monitor remedial work and contingency planning, particularly by crucial infrastructures such as telecommunications, utilities and the like, to ensure that the most critical services had fixed their own problems and were prepared for problems with others. While some commentators and experts argued that the coverage of the problem largely amounted to scaremongering,[2] it was only the safe passing of the main "event horizon" itself, January 1, 2000, that fully quelled public fears. Some experts who argued that scaremongering was occurring, such as Ross Anderson, Professor of Security Engineering at the University of Cambridge Computer Laboratory, have since claimed that despite sending out hundreds of press releases about research results suggesting that the problem was not likely to be as big a problem as some had suggested, they were largely ignored by the media.[2]
The practice of using two-digit dates for convenience predates computers.
"I'm one of the culprits who created this problem. I used to write those programs back in the 1960s and 1970s, and was proud of the fact that I was able to squeeze a few elements of space out of my program by not having to put a 19 before the year. Back then, it was very important. We used to spend a lot of time running through various mathematical exercises before we started to write our programs so that they could be very clearly delimited with respect to space and the use of capacity. It never entered our minds that those programs would have lasted for more than a few years. As a consequence, they are very poorly documented. If I were to go back and look at some of the programs I wrote 30 years ago, I would have one terribly difficult time working my way through step-by-step."
—Alan Greenspan, 1998[6]
In the 1960s, computer memory and mass storage were scarce and expensive, and most data processing was done on punched cards which represented text data in 80-column records. Programming languages of the time, such as COBOL and RPG, processed numbers in their character representations. They occasionally used an extra bit called a zone punch to save one character for a minus sign on a negative number, or compressed two digits into one byte in a form called binary-coded decimal, but otherwise processed numbers as straight text. Over time the punched cards were converted to magnetic tape and then disk files and later to simple databases, but the structure of the programs usually changed very little. Popular software like dBase continued the practice of storing dates as text well into the 1980s and 1990s.
Saving two digits for every date field was significant in the 1960s. Since most programs at that time were short-lived, written to solve a specific problem or control a specific hardware setup, neither managers nor programmers expected their programs to remain in use for many decades. The realization that databases were a new type of program with different characteristics had not yet come, and hence most did not consider the missing two digits of the year a significant problem.
There were exceptions, of course. The first person known to publicly address this issue was Bob Bemer, who had noticed it in 1958 as a result of work on genealogical software. He spent the next twenty years trying to make programmers, IBM, the US government and the ISO aware of the problem, with little result. This included the recommendation that the COBOL PICTURE clause should be used to specify four digit years for dates. This could have been done by programmers at any time from the initial release of the first COBOL compiler in 1961 onwards. However, lack of foresight, the desire to save storage space, and overall complacency prevented this advice from being followed. Despite magazine articles on the subject from 1970 onwards, the majority of programmers only started recognizing Y2K as a looming problem in the mid-1990s, but even then, inertia and complacency caused it to be mostly unresolved until the last few years of the decade. In 1989 Erik Naggum was instrumental in ensuring that Internet mail used four digit representations of years by including a strong recommendation to this effect in the Internet host requirements document RFC 1123.[7]
Storage of a combined date and time within a fixed binary field is often considered a solution, but the possibility for software to misinterpret dates remains because such date and time representations must be relative to some known origin. Rollover of such systems is still a problem but can happen at varying dates and can fail in various ways. For example:
Spreadsheet programs such as Microsoft Excel count days from an epoch of January 0, 1900 (that is, December 31, 1899), and still regard the year 1900 as a leap year to maintain backward compatibility with earlier software.
Even before 1 January 2000 arrived, there were also some worries about 9 September 1999 (albeit lesser compared to those generated by Y2K). Because this date could also be written in the numeric format 9/9/99, it could have conflicted with the date value 9999, frequently used to specify an unknown date. It was thus possible that database programs might act on the records containing unknown dates on that day.[11] Somewhat similar to this is the end-of-file code 9999, used in older programming languages. While fears arose that some programs might unexpectedly terminate on that date, the bug was more likely to confuse computer operators than machines.
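The 9/9/99 concern can be sketched in a few lines; the field layout and sentinel handling here are hypothetical, purely for illustration.

```python
# Hypothetical sketch: a record system that uses the digits 9999 as a
# sentinel meaning "date unknown" (or "end of file" in older languages).
UNKNOWN_DATE = "9999"

def is_unknown(date_field: str) -> bool:
    # Compare the bare digits of a d/m/yy field against the sentinel.
    return date_field.replace("/", "") == UNKNOWN_DATE

print(is_unknown("9/9/99"))    # True: 9 September 1999 matches the sentinel
print(is_unknown("10/9/99"))   # False: an ordinary date
```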
Normally, a year is a leap year if it is evenly divisible by four. A year divisible by 100, however, is not a leap year in the Gregorian calendar unless it is also divisible by 400. For example, 1600 was a leap year, but 1700, 1800 and 1900 were not. Some programs may have relied on the oversimplified rule that any year divisible by four is a leap year. This method works for the year 2000 (because it is a leap year), and will not become a problem until 2100, by which time older legacy programs will likely have long since been replaced. For information on why century years are treated differently, see Gregorian calendar.
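A brief sketch comparing the oversimplified rule with the full Gregorian rule:

```python
def is_leap_naive(year: int) -> bool:
    # The oversimplified rule some programs relied on.
    return year % 4 == 0

def is_leap_gregorian(year: int) -> bool:
    # Full Gregorian rule: divisible by 4, except century years,
    # which must also be divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

for y in (1900, 2000, 2100):
    print(y, is_leap_naive(y), is_leap_gregorian(y))
# 1900 True False  -- the shortcut is wrong
# 2000 True True   -- the rules agree, so the year 2000 passed safely
# 2100 True False  -- where the shortcut next gives a wrong answer
```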
Some systems had problems once the year rolled over to 2010. This was dubbed by some in the media as the "Y2K+10" or "Y2.01k" problem.[12]
The main source of problems was confusion between binary number encoding and Binary-coded decimal (BCD) encodings of numbers. Both binary and BCD encode the numbers 0–9 as 0x0–0x9. But BCD encodes the number 10 as 0x10, whereas binary encodes the number 10 as 0x0A; 0x10 interpreted as a binary encoding represents the number 16. For example, because the SMS protocol uses BCD for dates, some mobile phone software incorrectly reported dates of SMSes as 2016 instead of 2010.
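The BCD confusion can be reproduced in a few lines; this is a generic sketch, not the code of any affected phone firmware.

```python
def to_bcd(n: int) -> int:
    # Pack a two-digit decimal number into one byte, one digit per nibble.
    return ((n // 10) << 4) | (n % 10)

year_field = to_bcd(10)                 # the year 2010 stored as BCD
print(hex(year_field))                  # 0x10

# Reading the byte as a plain binary integer yields 16, not 10:
print(2000 + year_field)                # 2016 -- the misreported year

# Decoding it as BCD recovers the intended value:
print(2000 + (year_field >> 4) * 10 + (year_field & 0x0F))   # 2010
```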
Among the affected systems were EFTPOS terminals,[13] specific mobile phones[14] and older Sony PlayStation 3 models (except the Slim).[15]
Windows Mobile was the first reported software to be hit by this glitch; it changed the sent date of any phone message sent after 1 January 2010 from the year 2010 to 2016.[16] The most important such glitch occurred in Germany, where upwards of 20 million bank cards became unusable, and with Citibank Belgium, whose digipass customer identification chips stopped working.[17]
The original Unix timestamp (time_t) stores a date and time as a signed 32-bit integer representing the number of seconds since January 1, 1970. On 19 January 2038 this count will exceed the largest value the 32-bit field can hold, causing the Year 2038 problem (also known as the Unix Millennium bug, or Y2K38). To solve this problem, many systems and languages have switched to a 64-bit timestamp, or supplied alternatives which are 64-bit.
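The limit follows directly from the definition; the following standard-library sketch shows where a signed 32-bit counter of seconds since 1970 runs out.

```python
import datetime

EPOCH = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)
INT32_MAX = 2**31 - 1   # largest value a signed 32-bit time_t can hold

# The last representable moment before the counter wraps:
print(EPOCH + datetime.timedelta(seconds=INT32_MAX))
# 2038-01-19 03:14:07+00:00

# One second later the value wraps to -2**31, which a 32-bit system would
# interpret as a date late in 1901:
print(EPOCH + datetime.timedelta(seconds=-2**31))
# 1901-12-13 20:45:52+00:00
```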
Ambiguity and errors can arise when different entities use different orderings of the day, month, and year in a date. For example, 30/11/05 could be interpreted as November 5, 2030; November 30, 2005; May 30, 2011; or even May 11, 2030. Any abbreviated year from 2001 to 2031 can be confused with a day of the month in this way. Standardising the date format (as in ISO 8601: yyyy-mm-dd) is difficult due to the open nature of writing, international differences, and human behaviour.
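The ambiguity is easy to demonstrate by parsing the same string under different assumed field orders; the orderings below are chosen purely for illustration.

```python
from datetime import datetime

ambiguous = "30/11/05"
for order, fmt in [("day/month/year", "%d/%m/%y"),
                   ("year/month/day", "%y/%m/%d"),
                   ("year/day/month", "%y/%d/%m")]:
    print(order, "->", datetime.strptime(ambiguous, fmt).date())
# day/month/year -> 2005-11-30
# year/month/day -> 2030-11-05
# year/day/month -> 2030-05-11

# The unambiguous ISO 8601 form spells out the four-digit year first:
print(datetime(2005, 11, 30).date().isoformat())   # '2005-11-30'
```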
Several very different approaches were used to solve the Year 2000 problem in legacy systems. The most common were date expansion, in which two-digit years in programs, files and databases were converted to four digits, and windowing, in which two-digit years were kept but interpreted relative to a pivot year, as sketched below.
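A sketch of the windowing technique, using an assumed pivot of 30; real remediation projects chose their own pivot values.

```python
PIVOT = 30   # assumed pivot: two-digit years below 30 are read as 20xx

def expand_year(yy: int, pivot: int = PIVOT) -> int:
    # Windowing: keep the stored two-digit year but interpret it inside a
    # 100-year window determined by the pivot.
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(5))    # 2005
print(expand_year(99))   # 1999
print(expand_year(29))   # 2029
# Windowing defers rather than eliminates the problem: once real dates reach
# the pivot, the two-digit years become ambiguous again.
```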
When 1 January 2000 arrived, the problems that occurred were generally regarded as minor. Problems did not always have to appear precisely at midnight: some programs were not active at that moment, and their faults only showed up when they were invoked. Not all problems recorded were directly caused by Y2K programming errors; minor technological glitches occur on a regular basis.
Reported problems include:
The United States Government responded to the Y2K threat by passing the Year 2000 Information and Readiness Disclosure Act, by working with private sector counterparts in order to ensure readiness, and by creating internal continuity of operations plans in the event of problems. The effort was coordinated out of the White House by the President's Council on Year 2000 Conversion, headed by John Koskinen.[24] The White House effort was conducted in coordination with the then-independent Federal Emergency Management Agency (FEMA) and an interim Critical Infrastructure Protection Group, then in the Department of Justice, now in Homeland Security. The US Government also promoted Y2K Information Sharing and Analysis Centers (ISACs) to share readiness between industries, without threat of antitrust violations or liability based on information shared.
The US Government followed a three-part approach to the problem: (1) Outreach and Advocacy, (2) Monitoring and Assessment, and (3) Contingency Planning and Regulation.[25]
A feature of US Government outreach was Y2K websites including Y2K.GOV. Presently, many US Government agencies have taken down their Y2K websites. Some of these documents may be available through National Archives and Records Administration[26] or The Wayback Machine.
Each federal agency had its own Y2K task force which worked with its private sector counterparts. The FCC had the FCC Year 2000 Task Force.[25][27]
Most industries had contingency plans that relied upon the Internet for backup communications. However, as no federal agency had clear authority with regard to the Internet at this time (it had passed from the US Department of Defense to the US National Science Foundation and then to the US Department of Commerce), no agency was assessing the readiness of the Internet itself. Therefore, on July 30, 1999, the White House held the White House Internet Y2K Roundtable.[28]
The International Y2K Cooperation Center (IY2KCC) was established at the behest of national Y2K coordinators from over 120 countries when they met at the First Global Meeting of National Y2K Coordinators at the United Nations in December 1998. IY2KCC established an office in Washington, D.C. in March 1999. Funding was provided by the World Bank, and Bruce W. McConnell was appointed as director.
IY2KCC's mission was to "promote increased strategic cooperation and action among governments, peoples, and the private sector to minimize adverse Y2K effects on the global society and economy." Activities of IY2KCC were conducted in six areas:
IY2KCC closed down in March 2000.[29]
Norway and Finland changed their national identification numbers to indicate the century in which a person was born. In both countries the birth year had been indicated with only two digits. However, a similar problem, the "Year 1900 problem" of distinguishing between people born in the 20th and 19th centuries, already existed, so the change was prompted more by the attention surrounding Y2K than by a genuinely new problem. In Finland the problem was solved by replacing the hyphen '-' in the number with the letter 'A' for people born in the 21st century. In Norway, the range of the individual numbers following the birth date was altered from 0-499 to 500-999.
The Y2K issue was a major topic of discussion in the late 1990s and, predictably, showed up in most popular media. A number of "Y2K disaster" books were published such as Deadline Y2K by Mark Joseph. Movies such as Y2K: Year to Kill capitalized on the currency of Y2K, as did numerous TV shows, comic strips, and computer games.
The total cost of the work done in preparation for Y2K is estimated at over 300 billion US dollars.[31] There are two ways to view the events of 2000 from the perspective of its aftermath:
This view holds that the vast majority of problems had been fixed correctly, and the money was well spent. The situation was essentially one of preemptive alarm. Those who hold this view claim that the lack of problems at the date change reflects the completeness of the project, and that many computer applications would not have continued to function into the 21st century without correction or remediation.
Others have claimed that there were no, or very few, critical problems to begin with, and that correcting the few minor mistakes as they occurred (the 'fix on failure' approach) would have been the most efficient and cost effective way to solve the problem. Editorial writing in the Wall Street Journal called Y2K an "end-of-the-world cult" and the "hoax of the century".[36] The opposing view was bolstered by a number of observations.